Error Bounds Between Marginal Probabilities and Beliefs of Loopy Belief Propagation Algorithm
Authors
Abstract
The belief propagation (BP) algorithm has become an increasingly popular method for probabilistic inference on general graphical models. When a network has loops, the algorithm may not converge, and even if it converges, the beliefs, i.e., the results of the algorithm, may not equal the exact marginal probabilities. When networks have loops, the algorithm is called Loopy BP (LBP). Tatikonda and Jordan applied the theory of Gibbs measures to the LBP algorithm and derived a sufficient condition for convergence. In this paper, we use Gibbs measure theory to investigate the discrepancy between a marginal probability and the corresponding belief. In particular, we obtain an error bound when the algorithm converges under a certain condition, which is a general result on the accuracy of the algorithm. We also perform numerical experiments to illustrate the effectiveness of the result.
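To make the quantity under study concrete, the following sketch runs sum-product LBP on a small binary pairwise model with a single loop and compares the resulting beliefs b_i against the exact marginals obtained by brute force. This is a generic Python/NumPy illustration under assumed potentials (the graph, coupling value, and local fields are invented for the example), not the paper's construction or its bound.

```python
# Minimal sketch (not the paper's formulation): sum-product loopy BP on a
# 3-node cycle with binary variables, compared against exact marginals from
# brute-force enumeration.  Graph, coupling and fields are assumptions chosen
# only so that beliefs and exact marginals differ visibly.
import itertools
import numpy as np

n = 3
edges = [(0, 1), (1, 2), (2, 0)]                 # a single 3-cycle
coupling = 0.8                                   # assumed edge strength
fields = [0.5, -0.3, 0.2]                        # assumed local fields
psi = {e: np.exp(coupling * np.array([[1.0, -1.0],
                                      [-1.0, 1.0]])) for e in edges}
phi = {i: np.exp(h * np.array([-1.0, 1.0])) for i, h in enumerate(fields)}

def exact_marginals():
    """Exact node marginals of p(x) ~ prod_i phi_i(x_i) prod_(i,j) psi_ij(x_i,x_j)."""
    p = np.zeros([2] * n)
    for x in itertools.product([0, 1], repeat=n):
        w = 1.0
        for i in range(n):
            w *= phi[i][x[i]]
        for (i, j) in edges:
            w *= psi[(i, j)][x[i], x[j]]
        p[x] = w
    p /= p.sum()
    return [p.sum(axis=tuple(k for k in range(n) if k != i)) for i in range(n)]

def loopy_bp(iters=100):
    """Parallel sum-product message passing; returns normalized beliefs b_i."""
    neighbors = {i: set() for i in range(n)}
    pot, m = {}, {}
    for (i, j) in edges:
        neighbors[i].add(j); neighbors[j].add(i)
        pot[(i, j)] = psi[(i, j)]          # pot[(a, b)] indexed as [x_a, x_b]
        pot[(j, i)] = psi[(i, j)].T
        m[(i, j)] = np.full(2, 0.5)        # uniform initial messages
        m[(j, i)] = np.full(2, 0.5)
    for _ in range(iters):
        new = {}
        for (i, j) in m:
            # m_{i->j}(x_j) ~ sum_{x_i} psi(x_i,x_j) phi_i(x_i) prod_{k in N(i)\{j}} m_{k->i}(x_i)
            prod = phi[i].copy()
            for k in neighbors[i] - {j}:
                prod *= m[(k, i)]
            msg = pot[(i, j)].T @ prod
            new[(i, j)] = msg / msg.sum()  # normalize for numerical stability
        m = new
    beliefs = []
    for i in range(n):
        b = phi[i].copy()
        for k in neighbors[i]:
            b *= m[(k, i)]
        beliefs.append(b / b.sum())
    return beliefs

# The quantity the paper bounds is exactly this kind of gap between the exact
# marginal P(x_i) and the converged belief b_i(x_i).
for i, (exact, belief) in enumerate(zip(exact_marginals(), loopy_bp())):
    print(f"node {i}: P(x_i=1)={exact[1]:.4f}  b_i(1)={belief[1]:.4f}  "
          f"gap={abs(exact[1] - belief[1]):.2e}")
```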
Similar resources
Message Error Analysis of Loopy Belief Propagation for the Sum-Product Algorithm
Belief propagation is known to perform extremely well in many practical statistical inference and learning problems using graphical models, even in the presence of multiple loops. The use of the belief propagation algorithm on graphical models with loops is referred to as Loopy Belief Propagation (LBP). Various sufficient conditions for convergence of LBP have been presented; however, general n...
Discriminated Belief Propagation
Near optimal decoding of good error control codes is generally a difficult task. However, for a certain type of (sufficiently) good codes an efficient decoding algorithm with near optimal performance exists. These codes are defined via a combination of constituent codes with low complexity trellis representations. Their decoding algorithm is an instance of (loopy) belief propagation and is base...
Learning unbelievable marginal probabilities
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we...
Loopy belief propagation for approximate inference: an empirical study
Recently, a number of researchers have demonstrated excellent performance by using "loopy belief propagation", i.e., Pearl's polytree algorithm in a Bayesian network with loops. The most dramatic instance is the near Shannon-limit performance of "Turbo Codes", error-correcting codes whose decoding algorithm is equivalent to loopy belief propagation. In this paper we ask: is there something sp...
Graph zeta function and loopy belief propagation
This paper discusses a link between the loopy belief propagation (LBP) algorithm and the Graph zeta function. The LBP algorithm is a nonlinear iteration to approximate the marginal or posterior probabilities required for various statistical inference, using the graph structure to define the joint probability. The theoretical properties of the LBP algorithm are not easy to analyze because of the...
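For reference, the "nonlinear iteration" in question is the standard sum-product update; in a generic pairwise formulation (notation is ours, not necessarily that of the paper) it reads:

```latex
% Standard sum-product (loopy BP) message update and belief, pairwise model.
\begin{align*}
  m^{(t+1)}_{i \to j}(x_j) &\propto \sum_{x_i} \psi_{ij}(x_i, x_j)\,\phi_i(x_i)
    \prod_{k \in N(i)\setminus\{j\}} m^{(t)}_{k \to i}(x_i),\\
  b_i(x_i) &\propto \phi_i(x_i) \prod_{k \in N(i)} m_{k \to i}(x_i),
\end{align*}
```

where \phi_i and \psi_{ij} are the node and edge potentials and N(i) is the neighborhood of node i; the beliefs b_i are read off at a fixed point of this map.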
Publication date: 2006